Ann Arbor


Cyber Racing Coach: A Haptic Shared Control Framework for Teaching Advanced Driving Skills

Shen, Congkai, Yu, Siyuan, Weng, Yifan, Ma, Haoran, Li, Chen, Yasuda, Hiroshi, Dallas, James, Thompson, Michael, Subosits, John, Ersal, Tulga

arXiv.org Artificial Intelligence

Abstract--This study introduces a haptic shared control framework designed to teach human drivers advanced driving skills. In this context, shared control refers to a driving mode where the human driver collaborates with an autonomous driving system to control the steering of a vehicle simultaneously. Advanced driving skills are those necessary to safely push the vehicle to its handling limits in high-performance driving such as racing and emergency obstacle avoidance. Previous research has demonstrated the performance and safety benefits of shared control schemes using both subjective and objective evaluations. However, these schemes have not been assessed for their impact on skill acquisition on complex and demanding tasks. Prior research on long-term skill acquisition either applies haptic shared control to simple tasks or employs other feedback methods like visual and auditory aids. To bridge this gap, this study creates a cyber racing coach framework based on the haptic shared control paradigm and evaluates its performance in helping human drivers acquire high-performance driving skills. The framework introduces (1) an autonomous driving system that is capable of cooperating with humans in a highly performant driving scenario; and (2) a haptic shared control mechanism along with a fading scheme to gradually reduce the steering assistance from autonomy based on the human driver's performance during training. Two benchmarks are considered: self-learning (no assistance) and full assistance during training. Results from a human subject study indicate that the proposed framework helps human drivers develop superior racing skills compared to the benchmarks, resulting in better performance and consistency. Advanced driving skills refer to a set of competencies that go beyond basic driving abilities in terms of situational awareness, hazard perception, risk management, and vehicle handling [1]. They are crucial in high-performance driving tasks such as racing, and can also improve safety in everyday driving [1], [2].

This work has been submitted to the IEEE for possible publication.
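The fading scheme described in the abstract can be pictured as a blending weight on the autonomy's steering torque that decays as the driver's measured performance improves. A minimal sketch is below; the function names, the lap-error performance metric, and the multiplicative decay rule are illustrative assumptions, not details taken from the paper.

```python
def blend_torque(tau_auto: float, tau_human: float, alpha: float) -> float:
    """Total steering torque at the wheel: the autonomy's contribution is
    scaled by the assistance level alpha in [0, 1], while the driver's
    torque always passes through."""
    return alpha * tau_auto + tau_human


def fade_assistance(alpha: float, lap_error: float, threshold: float,
                    decay: float = 0.8) -> float:
    """Reduce the assistance level after a training session in which the
    driver's tracking error fell below a threshold; otherwise keep it."""
    if lap_error < threshold:
        alpha *= decay
    return max(alpha, 0.0)
```

Under this sketch, a trainee who keeps improving sees alpha shrink toward zero, smoothly transitioning from full assistance to self-driving of the racing line.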


PUBLICSPEAK: Hearing the Public with a Probabilistic Framework in Local Government

Xu, Tianliang, Brown, Eva Maxfield, Dwyer, Dustin, Tomkins, Sabina

arXiv.org Artificial Intelligence

Local governments around the world are making consequential decisions on behalf of their constituents, and these constituents are responding with requests, advice, and assessments of their officials at public meetings. These many small meetings cannot all be covered by traditional newsrooms at scale. We propose PUBLICSPEAK, a probabilistic framework which can utilize meeting structure, domain knowledge, and linguistic information to discover public remarks in local government meetings. We then use our approach to inspect the issues raised by constituents in 7 cities across the United States. We evaluate our approach on a novel dataset of local government meetings and find that PUBLICSPEAK improves over the state of the art by 10% on average, and by up to 40%.


ROB 204: Introduction to Human-Robot Systems at the University of Michigan, Ann Arbor

Stirling, Leia, Montgomery, Joseph, Draelos, Mark, Mavrogiannis, Christoforos, Robert, Lionel P. Jr., Jenkins, Odest Chadwicke

arXiv.org Artificial Intelligence

The University of Michigan Robotics program focuses on the study of embodied intelligence that must sense, reason, act, and work with people to improve quality of life and productivity equitably across society. ROB 204, part of the core curriculum towards the undergraduate degree in Robotics, introduces students to topics that enable conceptually designing a robotic system to address users' needs from a sociotechnical context. Students are introduced to human-robot interaction (HRI) concepts and the process for socially-engaged design with a Learn-Reinforce-Integrate approach. In this paper, we discuss the course topics and our teaching methodology, and provide recommendations for delivering this material. Overall, students leave the course with a new understanding and appreciation for how human capabilities can inform requirements for a robotics system, how humans can interact with a robot, and how to assess the usability of robotic systems.


A Conceptual Framework for Conversational Search and Recommendation: Conceptualizing Agent-Human Interactions During the Conversational Search Process

Azzopardi, Leif, Dubiel, Mateusz, Halvey, Martin, Dalton, Jeffery

arXiv.org Artificial Intelligence

The conversational search task aims to enable a user to resolve information needs via natural language dialogue with an agent. In this paper, we aim to develop a conceptual framework of the actions and intents of users and agents explaining how these actions enable the user to explore the search space and resolve their information need. We outline the different actions and intents, before discussing key decision points in the conversation where the agent needs to decide how to steer the conversational search process to a successful and/or satisfactory conclusion. Essentially, this paper provides a conceptualization of the conversational search process between an agent and user, which provides a framework and a starting point for research, development and evaluation of conversational search agents.

While past work has started to tease out different actions that users and agents perform and respond to during the conversational search process, there has been little work on formalizing these actions and decisions. Thus the goal of this paper is to develop a conceptual framework of different actions and intents, along with the key decision points within the conversation. Our aim is to make these tasks explicit in order to formalize the research, development and evaluation of conversational search agents. To this end, we first examine the key actions and intents identified in past work, and enumerate these along with others that can be naturally inferred from a simulated conversational context, before discussing the key decisions that the agent needs to make in order to advance the conversation to a satisfactory or successful end.


Autonomy Simulation & Data Analyst (Hybrid or Remote) at May Mobility - Ann Arbor, MI

#artificialintelligence

May Mobility is transforming cities through autonomous technology to create a safer, greener, more accessible world. Based in Ann Arbor, Michigan, May develops and deploys autonomous vehicles (AVs) powered by our innovative Multi-Policy Decision Making (MPDM) technology that literally reimagines the way AVs think. Our vehicles do more than just drive themselves - they provide value to communities, bridge public transit gaps and move people where they need to go safely, easily and with a lot more fun. We're building the world's best autonomy system to reimagine transit by minimizing congestion, expanding access and encouraging better land use in order to foster more green, vibrant and livable spaces. Since our founding in 2017, we've given more than 300,000 autonomy-enabled rides to real people around the globe.


Improving accuracy of computer vision models, Voxel51 raises $12.5M

#artificialintelligence

Computer vision AI models rely on properly labeled data in order to infer the correct objects. Helping to verify that the data used for a model is accurate is the challenge that Ann Arbor, Michigan-based startup Voxel51 aims to solve with open-source tools and a commercial service called FiftyOne Teams. Ann Arbor is home to the University of Michigan, where Voxel51 cofounder and CEO Jason Corso works as a professor, and where he got the idea to build the new company.


Manager, Data Science (Hybrid)

#artificialintelligence



Help name one of Ann Arbor's new autonomous shuttles

#artificialintelligence

ANN ARBOR – A free autonomous shuttle service is coming to Ann Arbor in October, and providers are turning to the public to name one of the vehicles. The A2GO ride service is a collaboration between May Mobility, the University of Michigan's Mcity and Ann Arbor SPARK, and will operate around U-M's central campus, the State Street corridor and Kerrytown. May Mobility will operate a fleet of five autonomous shared vehicles: four hybrid-electric Lexus RX 450h cars and one fully electric Polaris GEM, which will have capacity for one wheelchair passenger. "We're looking for a creative, clever and memorable name," reads the contest page. "There's only one rule - the shuttle's name has to start with an 'M'!" Name suggestions can be submitted now through Sept. 8. Voting on semifinalists will be open from Sept. 14-23, and the winning name will be announced on Oct. 11 -- the scheduled start date for the service.


Delivery startup Refraction AI raises $4.2M to expand service areas

#artificialintelligence

Refraction AI, a company developing semi-autonomous delivery robots, today announced that it raised $4.2 million in seed funding led by Pillar VC. Refraction says that the proceeds will be used for customer acquisition, geographic expansion, and product development well into the next year. The worsening COVID-19 health crisis in much of the U.S. seems likely to hasten the adoption of self-guided robots and drones for goods transportation. These robots still require disinfection, which companies like Kiwibot, Starship Technologies, and Postmates conduct manually with sanitation teams. But in some cases, delivery rovers like Refraction's could minimize the risk of spreading disease.


Michigan Envisions Autonomous-Car Lane from Detroit to Ann Arbor

#artificialintelligence

Part of the evolution of self-driving cars is deploying the vehicles in geofenced areas: instead of being released into the entirety of the world, they're kept within a geographic area that has been mapped and determined to work well with the capabilities of an autonomous vehicle. Some of those areas might be special lanes reserved for vehicles driven by robots. Michigan is looking into creating such a lane. The state of Michigan and Cavnue (a company founded by Sidewalk Infrastructure Partners, which is part of Alphabet, the parent company of Google) have partnered to explore building a 40-mile driverless corridor between Detroit and Ann Arbor. The route would run along Michigan Avenue and I-94 and would connect to Detroit Metropolitan Airport, Detroit's Michigan Central Station, and the University of Michigan.